Estimation of Large Covariance Matrices
Authors
Abstract
This paper considers estimating a covariance matrix of p variables from n observations by either banding or tapering the sample covariance matrix, or estimating a banded version of the inverse of the covariance. We show that these estimates are consistent in the operator norm as long as (log p)/n → 0, and obtain explicit rates. The results are uniform over some fairly natural well-conditioned families of covariance matrices. We also introduce an analogue of the Gaussian white noise model and show that if the population covariance is embeddable in that model and well-conditioned, then the banded approximations produce consistent estimates of the eigenvalues and associated eigenvectors of the covariance matrix. The results can be extended to smooth versions of banding and to non-Gaussian distributions with sufficiently short tails. A resampling approach is proposed for choosing the banding parameter in practice. This approach is illustrated numerically on both simulated and real data.
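As a rough illustration of the banding estimator described in the abstract, the following Python sketch zeroes out every entry of the sample covariance matrix that lies more than k positions from the diagonal. The function name, the fixed bandwidth k, and the simulated data are illustrative assumptions; in the paper the banding parameter is chosen by resampling rather than fixed in advance.

```python
# Minimal sketch of banding a sample covariance matrix.
# Not the authors' implementation; bandwidth k is an illustrative assumption.
import numpy as np

def banded_covariance(X, k):
    """Band the sample covariance of X (n observations x p variables):
    keep entries with |i - j| <= k, set the rest to zero."""
    n, p = X.shape
    S = np.cov(X, rowvar=False)          # p x p sample covariance
    i, j = np.indices((p, p))
    mask = np.abs(i - j) <= k            # banding mask
    return S * mask

# Toy example: (log p)/n is small even though p is moderately large.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
Sigma_hat = banded_covariance(X, k=5)
```

A tapering estimator would replace the 0/1 mask with weights that decay smoothly away from the diagonal; the consistency results quoted above cover such smooth versions of banding as well.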
Similar Articles
Structure of Wavelet Covariance Matrices and Bayesian Wavelet Estimation of Autoregressive Moving Average Models with Long Memory Parameters
In the process of exploring and characterizing statistical populations, the analysis of data obtained from these populations is considered essential. One of the appropriate methods for data analysis is the structural study of the function fitted by these data. Wavelet transformation is one of the most powerful tools for analyzing such functions, and the structure of the wavelet coefficients is very impor...
Positive-Definite ℓ1-Penalized Estimation of Large Covariance Matrices
The thresholding covariance estimator has nice asymptotic properties for estimating sparse large covariance matrices, but it often has negative eigenvalues when used in real data analysis. To fix this drawback of thresholding estimation, we develop a positive-definite ℓ1-penalized covariance estimator for estimating sparse large covariance matrices. We derive an efficient alternating direction m...
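The following Python sketch illustrates only the drawback mentioned in this snippet: a hard-thresholded sample covariance matrix need not be positive definite. The function name and threshold are illustrative assumptions, and the ADMM-based positive-definite estimator developed in the cited paper is not reproduced here.

```python
# Hedged sketch: hard-thresholding the off-diagonal entries of a sample
# covariance matrix and checking the smallest eigenvalue of the result.
# This is not the cited paper's positive-definite ℓ1-penalized estimator.
import numpy as np

def threshold_covariance(X, tau):
    S = np.cov(X, rowvar=False)
    T = np.where(np.abs(S) >= tau, S, 0.0)   # hard thresholding
    np.fill_diagonal(T, np.diag(S))          # leave the diagonal untouched
    return T

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 60))            # p > n: high-dimensional case
T = threshold_covariance(X, tau=0.2)
print(np.min(np.linalg.eigvalsh(T)))         # can be negative in practice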
A Well-Conditioned and Sparse Estimation of Covariance and Inverse Covariance Matrices Using a Joint Penalty
We develop a method for estimating well-conditioned and sparse covariance and inverse covariance matrices from a sample of vectors drawn from a sub-Gaussian distribution in a high-dimensional setting. The proposed estimators are obtained by minimizing a quadratic loss function with a joint penalty combining the ℓ1 norm and the variance of the eigenvalues. In contrast to some of the existing methods of covariance...
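A rough schematic of such a joint-penalty objective (the exact formulation and weighting in the cited paper may differ) is

$$
\hat{\Sigma} \;=\; \arg\min_{\Sigma \succ 0}\; \|\Sigma - \hat{S}\|_F^2 \;+\; \lambda \sum_{i \neq j} |\Sigma_{ij}| \;+\; \gamma \sum_{i=1}^{p} \bigl(\sigma_i(\Sigma) - \bar{\sigma}(\Sigma)\bigr)^2,
$$

where \(\hat{S}\) is the sample covariance matrix, \(\sigma_i(\Sigma)\) are the eigenvalues of \(\Sigma\), and \(\bar{\sigma}(\Sigma)\) is their mean; the ℓ1 term promotes sparsity while the eigenvalue-variance term keeps the estimate well conditioned.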
Nonparametric estimation of large covariance matrices of longitudinal data
Estimation of an unstructured covariance matrix is difficult because of its positive-definiteness constraint. This obstacle is removed by regressing each variable on its predecessors, so that estimating a covariance matrix becomes equivalent to estimating a sequence of varying-coefficient and varying-order regression models. Our framework is similar to the use of increasing-ord...
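A minimal Python sketch of the regression idea is shown below: regressing each variable on its predecessors corresponds to a modified Cholesky decomposition of the covariance matrix, which makes positive-definiteness automatic. The unrestricted least-squares fits and the function name are illustrative assumptions; the cited paper uses varying-coefficient, varying-order regression models instead.

```python
# Sketch of covariance estimation via sequential regressions
# (modified Cholesky decomposition); illustrative, not the paper's method.
import numpy as np

def cholesky_regression_covariance(X):
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    T = np.eye(p)                       # unit lower-triangular matrix
    d = np.empty(p)                     # residual (innovation) variances
    d[0] = Xc[:, 0].var()
    for j in range(1, p):
        phi, *_ = np.linalg.lstsq(Xc[:, :j], Xc[:, j], rcond=None)
        resid = Xc[:, j] - Xc[:, :j] @ phi
        T[j, :j] = -phi                 # negative regression coefficients
        d[j] = resid.var()
    Tinv = np.linalg.inv(T)
    return Tinv @ np.diag(d) @ Tinv.T   # positive definite whenever d > 0

rng = np.random.default_rng(2)
Sigma_hat = cholesky_regression_covariance(rng.standard_normal((100, 10)))
```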
Covariance Structures for Quantitative Genetic Analyses
Covariance matrices in quantitative genetic analyses have, by and large, been considered 'unstructured', i.e. for q random variables there are q(q + 1)/2 distinct covariance components. This implies that the number of parameters to be estimated increases quadratically with the number of variables. Multivariate analyses involving more than a few traits have been hampered by computa...